Results 1 - 20 of 214
1.
Clin Transplant ; 38(4): e15316, 2024 04.
Article in English | MEDLINE | ID: mdl-38607291

ABSTRACT

BACKGROUND: The incidence of graft failure following liver transplantation (LTx) has remained persistent. While traditional risk scores for LTx have limited accuracy, the potential of machine learning (ML) in this area remains uncertain, despite its promise in other transplant domains. This study aims to determine ML's predictive limitations in LTx by replicating methods used in previous heart transplant research. METHODS: This study utilized the UNOS STAR database, selecting 64,384 adult patients who underwent LTx between 2010 and 2020. Gradient boosting models (XGBoost and LightGBM) were used to predict 14-, 30-, and 90-day graft failure and compared against a conventional logistic regression model. Models were evaluated using both shuffled and rolling cross-validation (CV) methodologies. Model performance was assessed using the AUC across validation iterations. RESULTS: Across models predicting 14-day, 30-day, and 90-day graft survival, LightGBM consistently outperformed the others, achieving the highest AUCs of .740, .722, and .700, respectively, under shuffled CV. Under rolling CV, however, accuracy declined for every ML algorithm. The analysis revealed influential factors for graft survival prediction across all models, including total bilirubin, medical condition, recipient age, and donor AST, among others. Several features, such as donor age and recipient diabetes history, were important in two out of three models. CONCLUSIONS: LightGBM enhances short-term graft survival predictions post-LTx. However, due to changing medical practices and selection criteria, continuous model evaluation is essential. Future studies should focus on temporal variations and clinical implications, and should ensure model transparency for broader medical utility.


Subject(s)
Liver Transplantation , Adult , Humans , Liver Transplantation/adverse effects , Research Design , Algorithms , Bilirubin , Machine Learning
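
A minimal sketch of the validation contrast at the heart of this study: the same gradient-boosted classifier scored under shuffled versus rolling (time-ordered) cross-validation. The data, feature names, and label below are synthetic placeholders, not the actual UNOS STAR variables.

```python
# Sketch: shuffled vs. rolling (time-ordered) CV for a gradient-boosted
# graft-failure model. Data and features are synthetic stand-ins.
import numpy as np
from lightgbm import LGBMClassifier
from sklearn.model_selection import KFold, TimeSeriesSplit
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 5000
X = rng.normal(size=(n, 6))  # e.g., bilirubin, recipient age, donor AST, ...
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=n) > 1).astype(int)
# Assume rows are sorted by transplant date, so TimeSeriesSplit is "rolling" CV.

def cv_auc(splitter):
    aucs = []
    for train_idx, test_idx in splitter.split(X):
        model = LGBMClassifier(n_estimators=200, learning_rate=0.05)
        model.fit(X[train_idx], y[train_idx])
        aucs.append(roc_auc_score(y[test_idx],
                                  model.predict_proba(X[test_idx])[:, 1]))
    return np.mean(aucs)

print("shuffled CV AUC:", cv_auc(KFold(n_splits=5, shuffle=True, random_state=0)))
print("rolling CV AUC: ", cv_auc(TimeSeriesSplit(n_splits=5)))
```

Rolling CV only ever trains on earlier patients, so a drop in AUC relative to shuffled CV signals that practice patterns and selection criteria drift over time, which is what the abstract reports.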
3.
Surgery ; 2024 Apr 11.
Article in English | MEDLINE | ID: mdl-38609786

ABSTRACT

BACKGROUND: The impact of county-level food access on mortality associated with steatotic liver disease, as well as on post-liver transplant outcomes among individuals with steatotic liver disease, has not been characterized. METHODS: Data on steatotic liver disease-related mortality and outcomes of liver transplant recipients with steatotic liver disease between 2010 and 2020 were obtained from the Centers for Disease Control and Prevention mortality database as well as the Scientific Registry of Transplant Recipients database. These data were linked to the food desert score, defined as the proportion of the total population in each county characterized as having both low income and limited access to grocery stores. RESULTS: Among 2,710 counties included in the analytic cohort, median steatotic liver disease-related mortality was 27.3 per 100,000 population (interquartile range 24.9-32.1). Of note, patients residing in counties with high steatotic liver disease death rates were more likely to have higher food desert scores (low: 5.0, interquartile range 3.1-7.8 vs moderate: 6.1, interquartile range 3.8-9.3 vs high: 7.6, interquartile range 4.1-11.7). Among 28,710 patients who did undergo liver transplantation, 5,310 (18.4%) individuals lived in counties with a high food desert score. Liver transplant recipients who resided in counties with the worst food access were more likely to have a higher body mass index (>35 kg/m2: low food desert score, 17.3% vs highest food desert score, 20.1%). After transplantation, there was no difference in 2-year graft survival relative to county-level food access (food desert score: low: 88.4% vs high: 88.6%; P = .77). CONCLUSION: Poor food access was associated with a higher incidence of steatotic liver disease-related death, as well as lower utilization of liver transplantation. In contrast, among patients who did receive a liver transplant, 2-year graft survival did not differ across food access strata. Policy initiatives should target the expansion of transplantation services to vulnerable communities in which steatotic liver disease mortality is high.

4.
Liver Transpl ; 2024 Apr 17.
Article in English | MEDLINE | ID: mdl-38625836

ABSTRACT

BACKGROUND: The use of older donors after circulatory death (DCD) for liver transplantation (LT) has increased over the past decade. This study examined whether outcomes of LT using older DCD donors (≥50 y) have improved with advancements in surgical/perioperative care and normothermic machine perfusion (NMP) technology. METHODS: A total of 7,602 DCD LT cases from the UNOS database (2003-2022) were reviewed. The impact of older DCD donors on graft survival (GS) was assessed using Kaplan-Meier and hazard ratio (HR) analyses. RESULTS: 1,447 LT cases (19.0%) involved older DCD donors. Although their use decreased from 2003 to 2014, a resurgence was noted post-2015, reaching 21.9% of all LT in the last four years (2019-2022). Initially, 90-day and one-year GS for older DCDs were worse than for younger DCDs, but this difference narrowed over time, with no statistical difference after 2015. Similarly, HRs for graft loss in older DCD LT have recently become nonsignificant. In older DCD LT, NMP usage has increased recently, especially in cases with extended donor-recipient distances, while the median time from asystole to aortic cross-clamp has decreased. Multivariable Cox regression analyses revealed that in the early phase, asystole to cross-clamp time had the highest HR for graft loss in older DCD LT without NMP, while in the later phases, cold ischemic time (CIT >5.5 h) was a significant predictor. CONCLUSION: LT outcomes using older DCD donors have become comparable to those from younger DCD donors, with recent HRs for graft loss becoming nonsignificant. Strategies adopted in the recent period could mitigate these risks, including managing CIT (≤5.5 h), reducing asystole to cross-clamp time, and adopting NMP for longer distances. Optimal use of older DCD donors may alleviate the donor shortage.
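
A hedged sketch of the survival comparison this abstract describes: Kaplan-Meier curves and a log-rank test for older versus younger DCD grafts, plus a Cox model over the risk factors flagged in the text. The file and column names are illustrative stand-ins, not the UNOS schema.

```python
# Sketch: graft survival for older (>=50 y) vs. younger DCD donors.
# "dcd_lt_cohort.csv" and its columns are hypothetical placeholders.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("dcd_lt_cohort.csv")   # one row per DCD liver transplant
old = df["donor_age"] >= 50

km = KaplanMeierFitter()
for label, grp in [("older DCD", df[old]), ("younger DCD", df[~old])]:
    km.fit(grp["graft_years"], grp["graft_loss"], label=label)
    print(label, "1-year GS:", float(km.survival_function_at_times(1.0).iloc[0]))

print("log-rank p:", logrank_test(
    df[old]["graft_years"], df[~old]["graft_years"],
    event_observed_A=df[old]["graft_loss"],
    event_observed_B=df[~old]["graft_loss"]).p_value)

# Adjusted hazard of graft loss, with the times highlighted in the abstract.
cph = CoxPHFitter()
cph.fit(df[["graft_years", "graft_loss", "donor_age", "cit_hours",
            "asystole_to_crossclamp_min"]], "graft_years", "graft_loss")
cph.print_summary()
```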

5.
Clin Transplant ; 38(4): e15290, 2024 04.
Article in English | MEDLINE | ID: mdl-38545890

ABSTRACT

BACKGROUND: Over the last decade, there has been a surge in overdose deaths due to the opioid crisis. We sought to characterize the temporal change in overdose donor (OD) use in liver transplantation (LT), as well as associated post-LT outcomes, relative to the COVID-19 era. METHODS: LT candidates and donors listed between January 2016 and September 2022 were identified from the Scientific Registry of Transplant Recipients database. Trends in LT donors and changes related to OD were assessed pre- versus post-COVID-19 (February 2020). RESULTS: Between 2016 and 2022, most counties in the United States experienced an increase in overdose-related deaths (n = 1284, 92.3%), with many counties (n = 458, 32.9%) seeing drug overdose deaths more than double. Concurrently, there was an 11.2% increase in overall donors, including a 41.7% increase in the number of donors who died from drug overdose. In the pre-COVID-19 era, overdose was the fourth most common mechanism of donor death, while in the post-COVID-19 era it was the second most common. OD donors were younger (OD: 35 yrs, IQR 29-43 vs. non-OD: 43 yrs, IQR 31-56), less likely to have a body mass index ≥35 kg/m2 (OD: 31.2% vs. non-OD: 33.5%), more likely to be HCV+ (OD: 28.9% vs. non-OD: 5.4%), and less likely to have a total bilirubin ≥1.1 mg/dL (OD: 12.9% vs. non-OD: 20.1%) (all p < .001). Receipt of an OD graft was not associated with worse graft survival (HR .94, 95% CI .88-1.01, p = .09). CONCLUSIONS: Opioid deaths markedly increased following the COVID-19 pandemic, substantially altering the LT donor pool in the United States.


Subject(s)
COVID-19 , Drug Overdose , Liver Transplantation , Humans , United States/epidemiology , Opioid Epidemic , Pandemics , Tissue Donors , COVID-19/epidemiology
7.
Ann Surg Oncol ; 31(5): 3087-3097, 2024 May.
Article in English | MEDLINE | ID: mdl-38347332

ABSTRACT

INTRODUCTION: Data on clinical characteristics and disease-specific prognosis among patients with early onset intrahepatic cholangiocarcinoma (ICC) are currently limited. METHODS: Patients undergoing hepatectomy for ICC between 2000 and 2020 were identified using a multi-institutional database. The association of early (≤50 years) versus typical onset (>50 years) ICC with recurrence-free survival (RFS) and disease-specific survival (DSS) was assessed in the multi-institutional database and validated in an external cohort. The genomic and transcriptomic profiles of early versus late onset ICC were analyzed using The Cancer Genome Atlas (TCGA) and Memorial Sloan Kettering Cancer Center databases. RESULTS: Among 971 patients undergoing resection for ICC, 22.7% (n = 220) had early-onset ICC. Patients with early-onset ICC had worse 5-year RFS (24.1% vs. 29.7%, p < 0.05) and DSS (36.5% vs. 48.9%, p = 0.03) compared with patients with typical onset ICC, despite having earlier T-stage tumors and lower rates of microvascular invasion. In the validation cohort, patients with early-onset ICC had worse 5-year RFS (7.4% vs. 20.5%, p = 0.002) compared with individuals with typical onset ICC. Using the TCGA cohort, 652 genes were found to be upregulated (including ATP8A2) and 266 downregulated (including UTY and KDM5D) in early versus typical onset ICC. Genes frequently implicated as oncogenic drivers, including CDKN2A, IDH1, BRAF, and FGFR2, were infrequently mutated in early-onset ICC patients. CONCLUSIONS: Early-onset ICC has distinct clinical and genomic/transcriptomic features. Morphologic and clinicopathologic characteristics were unable to fully explain differences in outcomes between early and typical onset ICC patients. The current study offers a preliminary landscape of the molecular features of early-onset ICC.


Subject(s)
Bile Duct Neoplasms , Cholangiocarcinoma , Humans , Bile Duct Neoplasms/genetics , Bile Duct Neoplasms/surgery , Cholangiocarcinoma/genetics , Cholangiocarcinoma/surgery , Prognosis , Gene Expression Profiling , Hepatectomy , Genomics , Bile Ducts, Intrahepatic/pathology , Minor Histocompatibility Antigens , Histone Demethylases
8.
J Pathol Inform ; 15: 100360, 2024 Dec.
Article in English | MEDLINE | ID: mdl-38292073

ABSTRACT

Hepatocellular carcinoma (HCC) is among the most common cancers worldwide, and tumor recurrence following liver resection or transplantation is one of the highest contributors to mortality in HCC patients after surgery. Using artificial intelligence (AI), we developed an interdisciplinary model to predict HCC recurrence and patient survival following surgery. We collected whole-slide H&E images, clinical variables, and follow-up data from 300 patients with HCC who underwent transplant and 169 patients who underwent resection at the Cleveland Clinic. A deep learning model was trained to predict recurrence-free survival (RFS) and disease-specific survival (DSS) from the H&E-stained slides. Repeated cross-validation splits were used to compute robust C-index estimates, and the results were compared to those obtained by fitting a Cox proportional hazard model using only clinical variables. While the deep learning model alone was predictive of recurrence and survival among patients in both cohorts, integrating the clinical and histologic models significantly increased the C-index in each cohort. In every subgroup analyzed, we found that a combined clinical and deep learning model better predicted post-surgical outcome in HCC patients compared to either approach independently.
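
A rough sketch of the model-fusion step described here: a Cox model on clinical variables alone versus one that adds the slide-level deep-learning output, compared by C-index. The column names (including dl_risk for the deep-learning score) are illustrative assumptions, not the study's actual variables.

```python
# Sketch: comparing a clinical-only Cox model to one augmented with a
# deep-learning risk score from H&E slides. Columns are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.utils import concordance_index

df = pd.read_csv("hcc_cohort.csv")       # time, event, covariates (assumed)

clinical = ["afp", "tumor_size_cm", "age"]
cph_clin = CoxPHFitter().fit(df[["rfs_months", "recurred"] + clinical],
                             "rfs_months", "recurred")
cph_both = CoxPHFitter().fit(df[["rfs_months", "recurred", "dl_risk"] + clinical],
                             "rfs_months", "recurred")

for name, m in [("clinical only", cph_clin), ("clinical + DL", cph_both)]:
    # Negate the hazard: higher hazard should mean shorter time to recurrence.
    ci = concordance_index(df["rfs_months"],
                           -m.predict_partial_hazard(df),
                           df["recurred"])
    print(name, "C-index:", round(ci, 3))
```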

9.
Clin Transplant ; 38(1): e15155, 2024 Jan.
Article in English | MEDLINE | ID: mdl-37812571

ABSTRACT

BACKGROUND: Donors with hyperbilirubinemia are often not utilized for liver transplantation (LT) due to concerns about potential liver dysfunction and graft survival. The potential to mitigate organ shortages using such donors remains unclear. METHODS: This study analyzed adult deceased donor data from the United Network for Organ Sharing database (2002-2022). Hyperbilirubinemia was categorized as high total bilirubin (3.0-5.0 mg/dL) and very high bilirubin (≥5.0 mg/dL) in brain-dead donors. We assessed the impact of donor hyperbilirubinemia on 3-month and 3-year graft survival, comparing these outcomes to donors after circulatory death (DCD). RESULTS: Of 138,622 donors, 3,452 (2.5%) had high bilirubin and 1,999 (1.4%) had very high bilirubin levels. Utilization rates for the normal, high, and very high bilirubin groups were 73.5%, 56.4%, and 29.2%, respectively. No significant differences were found in 3-month and 3-year graft survival between groups. Donors with high bilirubin had superior 3-year graft survival compared to DCD donors (hazard ratio .83, p = .02). Factors associated with inferior short-term graft survival included recipient medical condition requiring intensive care unit (ICU) admission and longer cold ischemic time; factors associated with inferior long-term graft survival included older donor age, recipient ICU status, older recipient age, and longer cold ischemic time. Donors with ≥10% macrosteatosis in the very high bilirubin group were also associated with worse 3-year graft survival (p = .04). DISCUSSION: The study suggests that, although many grafts from donors with hyperbilirubinemia go unutilized, acceptable post-LT outcomes can be achieved using such donors. Careful selection may increase utilization and expand the donor pool without negatively affecting graft outcome.


Subject(s)
Liver , Tissue and Organ Procurement , Adult , Humans , Prognosis , Tissue Donors , Graft Survival , Hyperbilirubinemia/etiology , Bilirubin , Retrospective Studies
10.
Surgery ; 175(2): 513-521, 2024 02.
Article in English | MEDLINE | ID: mdl-37980203

ABSTRACT

BACKGROUND: Long-distance-traveling liver grafts in liver transplantation present challenges due to prolonged cold ischemic time and increased risk of ischemia-reperfusion injury. We identified donor and recipient characteristics of long-distance-traveling liver grafts and risk factors associated with their use. METHODS: We conducted a retrospective analysis of liver transplantation patients registered from 2014 to 2020 in the United Network for Organ Sharing registry database. Donor, recipient, and transplant factors of graft survival were compared between short-travel grafts and long-distance-traveling liver grafts (traveled >500 miles). RESULTS: During the study period, 28,265 patients received a donation after brainstem death liver transplantation and 3,250 a donation after circulatory death liver transplantation. The long-distance-traveling liver graft rate was 6.2% in donation after brainstem death liver transplantation and 7.1% in donation after circulatory death liver transplantation. The 90-day graft survival rates were significantly worse for long-distance-traveling liver grafts (donation after brainstem death: 95.7% vs 94.5%; donation after circulatory death: 94.5% vs 93.9%). The 3-year graft survival rates were similar for long-distance-traveling liver grafts (donation after brainstem death: 85.5% vs 85.1%; donation after circulatory death: 81.0% vs 80.4%). Cubic spline regression analyses revealed that travel distance did not linearly worsen the prognosis of 3-year graft survival. On the other hand, younger donor age, lower donor body mass index, and shorter cold ischemic time mitigated the negative impact on 90-day graft survival in long-distance-traveling liver grafts. CONCLUSION: The use of long-distance-traveling liver grafts negatively impacts 90-day graft survival but not 3-year graft survival. Moreover, long-distance-traveling liver grafts are more feasible when appropriate donor and recipient factors offset the extended cold ischemic time. Machine perfusion may further improve long-distance-traveling liver graft use. Enhanced collaboration between organ procurement organizations and transplant centers and optimized transportation systems are essential for increasing long-distance-traveling liver graft use, ultimately expanding the donor pool.


Subject(s)
Liver Transplantation , Tissue and Organ Procurement , Humans , Liver Transplantation/adverse effects , Retrospective Studies , Living Donors , Tissue Donors , Liver , Risk Factors , Graft Survival
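
A hedged sketch of the cubic-spline idea above: letting travel distance enter a Cox model through a spline basis so its effect on graft survival need not be linear. The file and column names are illustrative placeholders, and the study's exact spline specification is not stated here.

```python
# Sketch: travel distance as a natural cubic spline in a Cox model,
# so nonlinearity in the distance effect can be detected.
import pandas as pd
from patsy import dmatrix
from lifelines import CoxPHFitter

df = pd.read_csv("traveling_grafts.csv")   # hypothetical: one row per graft

# Natural cubic spline basis for distance (miles), 4 degrees of freedom.
spline = dmatrix("cr(distance_miles, df=4) - 1", df, return_type="dataframe")
model_df = pd.concat([df[["graft_years", "graft_loss", "donor_age",
                          "cit_hours"]], spline], axis=1)

cph = CoxPHFitter().fit(model_df, "graft_years", "graft_loss")
cph.print_summary()   # nonlinearity shows up across the spline coefficients
```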
11.
Surgery ; 175(2): 432-440, 2024 02.
Article in English | MEDLINE | ID: mdl-38001013

ABSTRACT

BACKGROUND: We sought to characterize the risk of postoperative complications relative to the surgical approach and overall synchronous colorectal liver metastases tumor burden score. METHODS: Patients with synchronous colorectal liver metastases who underwent curative-intent resection between 2000 and 2020 were identified from an international multi-institutional database. Propensity score matching was employed to control for heterogeneity between the 2 groups. A virtual twins analysis was performed to identify potential subgroups of patients who might benefit more from staged versus simultaneous resection. RESULTS: Among 976 patients who underwent liver resection for synchronous colorectal liver metastases, 589 patients (60.3%) had a staged approach, whereas 387 (39.7%) patients underwent simultaneous resection of the primary tumor and synchronous colorectal liver metastases. After propensity score matching, 295 patients who underwent each surgical approach were analyzed. Overall, the incidence of postoperative complications was 34.1% (n = 201). Among patients with high tumor burden scores, simultaneous resection was associated with a higher incidence of postoperative complications; in contrast, among patients with low or medium tumor burden scores, the likelihood of complications did not differ based on the surgical approach. Virtual twins analysis demonstrated that preoperative tumor burden score was important in identifying which subgroups of patients benefited most from staged versus simultaneous resection. Simultaneous resection was associated with better outcomes among patients with a tumor burden score <9 and a node-negative right-sided primary tumor; in contrast, staged resection was associated with better outcomes among patients with node-positive left-sided primary tumors and a higher tumor burden score. CONCLUSION: Among patients with high tumor burden scores, simultaneous resection of the primary tumor and liver metastases was associated with an increased incidence of postoperative complications.


Subject(s)
Colorectal Neoplasms , Liver Neoplasms , Humans , Colorectal Neoplasms/pathology , Tumor Burden , Hepatectomy/adverse effects , Liver Neoplasms/surgery , Liver Neoplasms/secondary , Postoperative Complications/epidemiology , Postoperative Complications/etiology , Postoperative Complications/surgery , Colectomy/adverse effects , Morbidity , Retrospective Studies
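
A rough sketch of the propensity score matching step used above: a logistic model for the probability of simultaneous resection, then 1:1 nearest-neighbor matching on the score within a caliper. Covariate and file names are illustrative, and the study's exact matching specification is not given here.

```python
# Sketch: 1:1 nearest-neighbor propensity score matching of simultaneous
# vs. staged resection. Covariates and file name are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

df = pd.read_csv("sync_crlm.csv")        # one row per patient (assumed)
covars = ["age", "tumor_burden_score", "cea", "node_positive"]

ps = LogisticRegression(max_iter=1000).fit(df[covars], df["simultaneous"])
df["pscore"] = ps.predict_proba(df[covars])[:, 1]

treated = df[df["simultaneous"] == 1]
control = df[df["simultaneous"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
dist, idx = nn.kneighbors(treated[["pscore"]])

# Keep pairs within a caliper (here 0.2 SD of the score); this greedy
# match is with replacement, a simplification of common practice.
caliper = 0.2 * df["pscore"].std()
keep = dist.ravel() <= caliper
matched = pd.concat([treated[keep], control.iloc[idx.ravel()[keep]]])
print("matched pairs:", int(keep.sum()))
```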
12.
Ann Surg Oncol ; 31(2): 697-700, 2024 Feb.
Article in English | MEDLINE | ID: mdl-37996635

ABSTRACT

Colorectal cancer is the second most common cause of cancer-related death worldwide, and half of patients present with colorectal liver metastasis (CRLM). Liver transplant (LT) has emerged as a treatment modality for otherwise unresectable CRLM. Since the publication of the Lebeck-Lee systematic review in 2022, additional evidence has come to light supporting LT for CRLM in highly selected patients. This includes reports of >10-year follow-up with over 80% survival rates in low-risk patients. As these updated reports have significantly changed our collective knowledge, this article is intended to serve as an update to the 2022 systematic review to include the most up-to-date evidence on the subject.


Subject(s)
Colorectal Neoplasms , Liver Neoplasms , Liver Transplantation , Humans , Hepatectomy , Colorectal Neoplasms/pathology , Liver Neoplasms/secondary , Antineoplastic Combined Chemotherapy Protocols
14.
Transplantation ; 108(2): 498-505, 2024 Feb 01.
Article in English | MEDLINE | ID: mdl-37585345

ABSTRACT

BACKGROUND: The allocation system for livers began using acuity circles (AC) in 2020. In this study, we sought to evaluate the impact of AC policy on the utilization rate for liver transplantation (LT). METHODS: Using US national registry data between 2018 and 2022, LTs were equally divided into 2 eras: pre-AC (before February 4, 2020) and post-AC (February 4, 2020, and after). Deceased potential liver donors were defined as deceased donors from whom at least 1 organ was procured. RESULTS: The annual number of deceased potential liver donors increased post-AC (from 10,423 to 12,259), approaching the number of new waitlist registrations for LT (n = 12,801). Although the discard risk index of liver grafts was comparable between the pre- and post-AC eras, liver utilization rates in donation after brain death (DBD) and donation after circulatory death (DCD) donors were lower post-AC (P < 0.01; 79.8% versus 83.4% and 23.7% versus 26.0%, respectively). Recipient factors, ie, no recipient located, recipient determined unsuitable, or time constraints, were more likely to be reasons for nonutilization after implementation of the AC allocation system compared to the pre-AC era (20.0% versus 12.3% for DBD donors and 50.1% versus 40.8% for DCD donors). Among non-high-volume centers, centers with lower utilization of marginal DBD donors or DCD donors were more likely to decrease LT volume post-AC. CONCLUSIONS: Although the number of deceased potential liver donors has increased, overall liver utilization among deceased donors has decreased in the post-AC era. To maximize the donor pool for LT, future efforts should target specific reasons for liver nonutilization.


Subject(s)
Liver Transplantation , Tissue and Organ Procurement , Humans , Liver Transplantation/adverse effects , Tissue Donors , Brain Death , Liver , Retrospective Studies , Graft Survival , Death
15.
Ann Surg ; 279(1): 104-111, 2024 01 01.
Article in English | MEDLINE | ID: mdl-37522174

ABSTRACT

OBJECTIVE: To evaluate long-term oncologic outcomes of patients post-living donor liver transplantation (LDLT) within and outside standard transplantation selection criteria, and the added value of incorporating the New York-California (NYCA) score. BACKGROUND: LDLT offers an opportunity to decrease the liver transplantation waitlist, reduce waitlist mortality, and expand selection criteria for patients with hepatocellular carcinoma (HCC). METHODS: Primary adult LDLT recipients between October 1999 and August 2019 were identified from a multicenter cohort of 12 North American centers. Posttransplantation and recurrence-free survival were evaluated using the Kaplan-Meier method. RESULTS: Three hundred sixty LDLTs were identified. Patients within Milan criteria (MC) at transplantation had 1-, 5-, and 10-year posttransplantation survival of 90.9%, 78.5%, and 64.1%, versus 90.4%, 68.6%, and 57.7%, respectively, for those outside MC (P = 0.20). For patients within the University of California San Francisco (UCSF) criteria, respective posttransplantation survival was 90.6%, 77.8%, and 65.0%, versus 92.1%, 63.8%, and 45.8% for those outside UCSF (P = 0.08). Fifty-three (83%) patients classified as outside MC at transplantation would have been classified as either low or acceptable risk by the NYCA score; these patients had a 5-year overall survival of 72.2%. Similarly, 28 (80%) patients classified as outside UCSF at transplantation would have been classified as low or acceptable risk, with a 5-year overall survival of 65.3%. CONCLUSIONS: Long-term survival is excellent for patients with HCC undergoing LDLT within and outside selection criteria, exceeding the minimum recommended 5-year rate of 60% proposed by consensus guidelines. The NYCA categorization offers insight into identifying a substantial proportion of patients with HCC outside the MC and the UCSF criteria who still achieve post-LDLT outcomes similar to patients within the criteria.


Subject(s)
Carcinoma, Hepatocellular , Liver Neoplasms , Liver Transplantation , Adult , Humans , Liver Transplantation/methods , Living Donors , Neoplasm Recurrence, Local/etiology , Patient Selection , North America , Retrospective Studies , Treatment Outcome
16.
Clin Transplant ; 38(1): e15213, 2024 Jan.
Article in English | MEDLINE | ID: mdl-38064299

ABSTRACT

BACKGROUND: Outcomes of intestinal transplantation with colon allograft (ICTx) remain controversial. We aimed to assess the outcomes of ICTx in comparison to intestinal transplantation without colon (ITx) using the UNOS/OPTN registry database. METHODS: We retrospectively reviewed 2,612 patients who received primary intestinal transplants from 1998 to 2020. The rates of acute rejection (AR) within 6 months after transplant were compared between ICTx and ITx. Risk factors for 6-month AR were examined using a logistic regression model by era. Furthermore, conditional graft survival was analyzed to determine long-term outcomes of ICTx. RESULTS: Of 2,612 recipients, 506 (19.4%) received ICTx. Graft and patient survival in ICTx recipients were comparable to those in ITx recipients. While ICTx recipients had a higher incidence of AR within 6 months compared to ITx recipients over the entire study period (p = .002), colonic inclusion did not increase the risk of 6-month AR in the past decade. ICTx recipients who experienced 6-month AR had worse graft and patient survival compared to those who did not (p < .001 and p = .004, respectively). Among patients who did not develop 6-month AR, Cox proportional hazard model analysis revealed that colonic inclusion was independently associated with improved conditional graft survival. CONCLUSIONS: In the recent transplant era, colonic inclusion is no longer associated with a heightened risk of 6-month AR and may provide better long-term survival compared to ITx when AR is absent. Risk adjustment for rejection and proper immunosuppressive therapy are crucial to maximize the benefits of colonic inclusion.


Subject(s)
Kidney Transplantation , Humans , Retrospective Studies , Graft Rejection/etiology , Transplantation, Homologous , Graft Survival , Allografts
17.
Surgery ; 175(3): 868-876, 2024 Mar.
Article in English | MEDLINE | ID: mdl-37743104

ABSTRACT

BACKGROUND: We sought to characterize the impact that access to gastroenterologists/hepatologists has on liver transplantation listing, as well as on time on the liver transplantation waitlist and post-transplant outcomes. METHODS: Liver transplantation registrants aged >18 years between January 1, 2004 and December 31, 2019 were identified from the Scientific Registry of Transplant Recipients Standard Analytic Files. The liver transplantation registration ratio was defined as the ratio of liver transplant waitlist registrations in a given county per 1,000 liver-related deaths. RESULTS: A total of 150,679 liver transplantation registrants were included. Access to liver transplantation centers and liver-specific specialty physicians varied markedly throughout the United States. Of note, the liver transplantation registration ratio was lower in counties with poor access to liver-specific care versus counties with adequate access (poor access 137.2, interquartile range 117.8-163.2 vs adequate access 157.6, interquartile range 127.3-192.2, P < .001). Among patients referred for liver transplantation, the cumulative incidence of waitlist mortality and post-transplant graft survival were comparable among patients with poor versus adequate access to liver-specific care (both P > .05). Among liver transplantation recipients living in areas with poor access, after controlling for recipient and donor characteristics, cold ischemic time, and model for end-stage liver disease score, the area deprivation index predicted graft survival (referent, low area deprivation index; medium area deprivation index, hazard ratio 1.52, 95% confidence interval 1.03-12.23; high area deprivation index, hazard ratio 1.45, 95% confidence interval 1.01-12.09; both P < .05). CONCLUSION: Poor access to liver-specific care was associated with a reduction in liver transplantation registration; moreover, among patients living in counties with poor access, individuals residing in counties with high social deprivation had worse graft survival.


Subject(s)
End Stage Liver Disease , Liver Transplantation , Humans , United States/epidemiology , End Stage Liver Disease/surgery , Severity of Illness Index , Living Donors , Retrospective Studies , Waiting Lists
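
For concreteness, the registration ratio defined in this abstract is simple arithmetic; a short sketch with made-up county counts follows.

```python
# Sketch: the county-level LT registration ratio defined above --
# waitlist registrations per 1,000 liver-related deaths.
def registration_ratio(registrations: int, liver_deaths: int) -> float:
    return 1000 * registrations / liver_deaths

# Hypothetical county with 40 registrations and 290 liver-related deaths:
print(round(registration_ratio(40, 290), 1))  # 137.9, near the "poor access" median
```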
18.
Surgery ; 175(3): 645-653, 2024 Mar.
Article in English | MEDLINE | ID: mdl-37778970

ABSTRACT

BACKGROUND: Although systemic postoperative therapy after surgery for colorectal liver metastases is generally recommended, the benefit of adjuvant chemotherapy has been debated. We used machine learning to develop a decision tree and define which patients may benefit from adjuvant chemotherapy after hepatectomy for colorectal liver metastases. METHODS: Patients who underwent curative-intent resection for colorectal liver metastases between 2000 and 2020 were identified from an international multi-institutional database. An optimal policy tree analysis was used to determine the optimal assignment of adjuvant chemotherapy to subgroups of patients with respect to overall survival and recurrence-free survival. RESULTS: Among 1,358 patients who underwent curative-intent resection of colorectal liver metastases, 1,032 (76.0%) received adjuvant chemotherapy. After a median follow-up of 28.7 months (interquartile range 13.7-52.0), 5-year overall survival was 67.5% and 3-year recurrence-free survival was 52.6%. Adjuvant chemotherapy was associated with better recurrence-free survival (3-year recurrence-free survival: adjuvant chemotherapy, 54.4% vs no adjuvant chemotherapy, 46.8%; P < .001) but no significant improvement in overall survival (5-year overall survival: adjuvant chemotherapy, 68.1% vs no adjuvant chemotherapy, 65.7%; P = .15). Patients were randomly allocated into 2 cohorts (training data set, n = 679; testing data set, n = 679). The random forest model demonstrated good performance in predicting counterfactual probabilities of death and recurrence relative to receipt of adjuvant chemotherapy. According to the optimal policy tree, patient demographics, secondary tumor characteristics, and primary tumor characteristics defined the subpopulation that would benefit from adjuvant chemotherapy. CONCLUSION: A novel artificial intelligence methodology based on patient, primary tumor, and treatment characteristics may help clinicians tailor adjuvant chemotherapy recommendations after colorectal liver metastases resection.


Subject(s)
Chemotherapy, Adjuvant , Colorectal Neoplasms , Liver Neoplasms , Humans , Artificial Intelligence , Chemotherapy, Adjuvant/methods , Colorectal Neoplasms/drug therapy , Colorectal Neoplasms/pathology , Hepatectomy , Liver Neoplasms/drug therapy , Liver Neoplasms/surgery , Neoplasm Recurrence, Local/prevention & control , Neoplasm Recurrence, Local/etiology , Multicenter Studies as Topic , Databases as Topic
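
A simplified stand-in for the counterfactual analysis above: a T-learner with random forests to estimate each patient's expected benefit from adjuvant chemotherapy, followed by a shallow decision tree to make the treatment rule readable. The study used optimal policy trees; this greedy tree is only an approximation, and all column names are illustrative.

```python
# Sketch: counterfactual benefit estimation (T-learner) plus a shallow
# policy tree. File and column names are hypothetical placeholders.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeRegressor, export_text

df = pd.read_csv("crlm_resections.csv")
X = df[["age", "tumor_burden_score", "node_positive", "cea"]]

rf_tx = RandomForestClassifier(n_estimators=500, random_state=0)
rf_no = RandomForestClassifier(n_estimators=500, random_state=0)
rf_tx.fit(X[df["adjuvant"] == 1], df.loc[df["adjuvant"] == 1, "recurrence_3y"])
rf_no.fit(X[df["adjuvant"] == 0], df.loc[df["adjuvant"] == 0, "recurrence_3y"])

# Estimated reduction in 3-year recurrence probability if given chemotherapy.
benefit = rf_no.predict_proba(X)[:, 1] - rf_tx.predict_proba(X)[:, 1]

policy = DecisionTreeRegressor(max_depth=2).fit(X, benefit)
print(export_text(policy, feature_names=list(X.columns)))
```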
20.
Transplantation ; 2023 Nov 22.
Article in English | MEDLINE | ID: mdl-37990355

ABSTRACT

BACKGROUND: With the chronic shortage of donated organs, expanding the indications for liver transplantation (LT) from older donors is critical. Nonalcoholic steatohepatitis (NASH) stands out because of its unique systemic pathogenesis and high recurrence rate, both of which might make donor selection less decisive. The present study aims to investigate the usefulness of older donors in LT for NASH patients. METHODS: This retrospective cohort study was conducted using the Scientific Registry of Transplant Recipients database. The cohort was divided into 3 categories according to donor age: young (aged 16-35), middle-aged (36-59), and old donors (≥60). Multivariable and Kaplan-Meier analyses were performed to compare the risk of donor age on graft survival (GS). RESULTS: A total of 67,973 primary adult donation-after-brain-death LTs (2002-2016) were eligible for analysis. The multivariable analysis showed a reduced impact of donor age on GS for the NASH cohort (adjusted hazard ratio = 1.13, 95% confidence interval 1.00-1.27), comparing old to middle-aged donors. If the cohort was limited to NASH recipients plus 1 of the following (recipient age ≥60, body mass index <30, or Model for End-Stage Liver Disease score <30), adjusted hazard ratios were even smaller (0.99 [0.84-1.15], 0.92 [0.75-1.13], or 1.04 [0.91-1.19], respectively). Kaplan-Meier analysis revealed no significant differences in overall GS between old and middle-aged donors in these subgroups (P = 0.86, 0.28, and 0.11, respectively). CONCLUSIONS: Donor age was less influential for overall GS in the NASH cohort. Remarkably, old donors were equivalent to middle-aged donors in the subgroups with recipient age ≥60, recipient body mass index <30, or Model for End-Stage Liver Disease score <30.
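
A hedged sketch of the categorical Cox comparison in this abstract: donor age binned into the three categories above, with middle-aged donors as the reference level, fit within one of the subgroups. Columns and file name are illustrative, not the SRTR schema.

```python
# Sketch: hazard of graft loss for old vs. middle-aged donors in a NASH
# subgroup. "nash_lt.csv" and its columns are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("nash_lt.csv")          # NASH-recipient extract (assumed)
df["donor_age_cat"] = pd.cut(df["donor_age"], bins=[16, 35, 59, 120],
                             labels=["young", "middle", "old"])
# "middle" is the reference level, matching the comparison in the abstract.
dummies = pd.get_dummies(df["donor_age_cat"],
                         prefix="donor")[["donor_young", "donor_old"]]
model_df = pd.concat([df[["graft_years", "graft_loss", "meld",
                          "recipient_age"]], dummies.astype(float)], axis=1)

sub = model_df[df["recipient_age"] >= 60]    # one of the subgroups above
cph = CoxPHFitter().fit(sub, "graft_years", "graft_loss")
print("aHR, old vs. middle-aged:", cph.hazard_ratios_["donor_old"])
```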
